
    The relationship between magnetic and electrophysiological responses to complex tactile stimuli.

    Background: Magnetoencephalography (MEG) has become an increasingly popular technique for non-invasively characterizing neuromagnetic field changes in the brain at high temporal resolution. To examine the reliability of the MEG signal, we compared magnetic and electrophysiological responses to complex natural stimuli from the same animals. We examined changes in neuromagnetic fields, local field potentials (LFP) and multi-unit activity (MUA) in macaque monkey primary somatosensory cortex that were induced by varying the rate of mechanical stimulation. Stimuli were applied to the fingertips with three inter-stimulus intervals (ISIs): 0.33 s, 1 s and 2 s. Results: Signal intensity was inversely related to the rate of stimulation, but to different degrees for each measurement method. The decrease in response at higher stimulation rates was significantly greater for MUA than for LFP and MEG data, while no significant difference was observed between LFP and MEG recordings. Furthermore, response latency was shortest for MUA and longest for MEG data. Conclusion: The MEG signal is an accurate representation of electrophysiological responses to complex natural stimuli. Further, the intensity and latency of the MEG signal were better correlated with the LFP than with the MUA data, suggesting that the MEG signal primarily reflects synaptic currents rather than spiking activity. These differences in latency could be attributed to differences in the extent of spatial summation and/or differential laminar sensitivity.

    Beneficial effects of word final stress in segmenting a new language: evidence from ERPs

    Background: How do listeners manage to recognize words in an unfamiliar language? The physical continuity of the signal, which lacks reliable silent pauses between words, makes this a difficult task. However, there are multiple cues that can be exploited to localize word boundaries and to segment the acoustic signal. In the present study, word stress was manipulated together with statistical information and placed on different syllables within trisyllabic nonsense words to explore how these cues combine in an online word segmentation task. Results: The behavioral results showed that words were segmented better when stress was placed on the final syllable than when it was placed on the middle or first syllable. The electrophysiological results showed an increase in the amplitude of the P2 component, which appeared to be sensitive to word stress and its location within words. Conclusion: The results demonstrate that listeners can integrate specific prosodic and distributional cues when segmenting speech. An ERP component related to word-stress cues was identified: stressed syllables elicited larger P2 amplitudes than unstressed ones.
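The statistical (distributional) cue that the study combines with word stress is standardly operationalized as forward transitional probability between syllables; the sketch below illustrates the idea with hypothetical nonsense words and a made-up stream, not the study's actual materials:

```python
import random
from collections import Counter

def transitional_probabilities(syllables):
    """Forward transitional probabilities P(next | current) estimated
    from a continuous syllable stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    pred_counts = Counter(syllables[:-1])
    return {(a, b): n / pred_counts[a] for (a, b), n in pair_counts.items()}

# Hypothetical trisyllabic nonsense words, concatenated in random order
# with no pauses, as in statistical-learning segmentation designs.
random.seed(0)
words = [["pa", "bi", "ku"], ["ti", "du", "ro"], ["go", "la", "tu"]]
stream = [syll for _ in range(200) for syll in random.choice(words)]
tp = transitional_probabilities(stream)

# Within-word transitions are fully predictable (TP = 1.0); transitions
# across a word boundary are split among the possible next word onsets,
# so dips in TP mark candidate word boundaries.
```

Stress placement can then either coincide with or cut across these TP-defined boundaries, which is the manipulation the study exploits.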

    Factors Affecting Frequency Discrimination of Vibrotactile Stimuli: Implications for Cortical Encoding

    BACKGROUND: Measuring perceptual judgments about stimuli while manipulating their physical characteristics can uncover the neural algorithms underlying sensory processing. We carried out psychophysical experiments to examine how humans discriminate vibrotactile stimuli. METHODOLOGY/PRINCIPAL FINDINGS: Subjects compared the frequencies of two sinusoidal vibrations applied sequentially to one fingertip. Performance was reduced when (1) the root mean square velocity (or energy) of the vibrations was equated by adjusting their amplitudes, and (2) the vibrations were noisy (their temporal structure was irregular). These effects were super-additive when subjects compared noisy vibrations that had equal velocity, indicating that frequency judgments became more dependent on the vibrations' temporal structure when differential information about velocity was eliminated. To investigate which areas of the somatosensory system use information about velocity and temporal structure, we required subjects to compare vibrations applied sequentially to opposite hands. This paradigm exploits the fact that tactile input to neurons at early levels (e.g., the primary somatosensory cortex, SI) is largely confined to the contralateral side of the body, so these neurons are less able to contribute to vibration comparisons between hands. The subjects' performance was still sensitive to differences in vibration velocity, but became less sensitive to noise. CONCLUSIONS/SIGNIFICANCE: We conclude that vibration frequency is represented in different ways by different mechanisms distributed across multiple cortical regions. Which mechanisms support the "readout" of frequency varies according to the information present in the vibration. Overall, the present findings are consistent with a model in which information about vibration velocity is coded in regions beyond SI. While adaptive processes within SI also contribute to the representation of frequency, this adaptation is influenced by the temporal regularity of the vibration.
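The velocity-equating manipulation follows directly from the physics of a sinusoidal displacement: for x(t) = A·sin(2πft), the RMS velocity is 2πfA/√2, so matching velocity across frequencies fixes the amplitude ratio. A minimal sketch (the 100 µm / 20 Hz reference values are illustrative, not taken from the paper):

```python
import math

def rms_velocity(amplitude_um, freq_hz):
    """RMS velocity of a sinusoidal displacement x(t) = A*sin(2*pi*f*t):
    v(t) = 2*pi*f*A*cos(2*pi*f*t), so v_rms = 2*pi*f*A / sqrt(2)."""
    return 2 * math.pi * freq_hz * amplitude_um / math.sqrt(2)

def amplitude_for_matched_velocity(ref_amp_um, ref_freq_hz, target_freq_hz):
    """Amplitude giving the target frequency the same RMS velocity as the
    reference vibration: since v_rms is proportional to f*A, A2 = A1*f1/f2."""
    return ref_amp_um * ref_freq_hz / target_freq_hz

# Illustrative comparison pair: a 100 um, 20 Hz reference vs. a 40 Hz
# vibration whose amplitude is halved so both have equal RMS velocity.
a2 = amplitude_for_matched_velocity(100.0, 20.0, 40.0)
```

With velocity equated this way, only frequency (temporal structure) distinguishes the two vibrations, which is what makes the manipulation diagnostic.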

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and of the event structure, which modulates uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive left-/right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were presented for 90 seconds, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, though odd-numbered taps were paired with an asynchronous beep, with audiotactile SOAs ranging from -75 ms to 75 ms. Perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered, but not even-numbered, taps were paired with auditory beeps. This abolished the temporal-capture effect and, instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.
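The stimulus timing underlying these experiments can be sketched as a simple schedule generator; the function name and defaults below are illustrative, and only the 400 ms tap SOA and the ±75 ms audiotactile offsets come from the text:

```python
def tap_schedule(n_taps, tap_soa_ms=400.0, audiotactile_soa_ms=-75.0):
    """Onset times (ms) for an alternating left/right tap stream.
    Every tap is paired with a beep; odd-numbered taps (1st, 3rd, ...)
    get an asynchronous beep shifted by audiotactile_soa_ms (negative
    values mean the beep leads the tap), as in Experiment 1."""
    taps, beeps = [], []
    for i in range(n_taps):
        t = i * tap_soa_ms
        taps.append((t, "left" if i % 2 == 0 else "right"))
        # i is 0-based, so even indices are the odd-numbered taps.
        offset = audiotactile_soa_ms if i % 2 == 0 else 0.0
        beeps.append(t + offset)
    return taps, beeps

taps, beeps = tap_schedule(6)
```

Varying `audiotactile_soa_ms` shifts the perceived tap timing (temporal capture), while dropping beeps for one side, as in Experiment 3, changes the crossmodal event structure instead.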

    Cross-Modal Distortion of Time Perception: Demerging the Effects of Observed and Performed Motion

    Temporal information is often contained in multi-sensory stimuli, but it is currently unknown how the brain combines, for example, visual and auditory cues into a coherent percept of time. The existing studies of cross-modal time perception mainly support the “modality appropriateness hypothesis”, i.e. the domination of auditory temporal cues over visual ones because of the higher precision of audition for time perception. However, these studies suffer from methodological problems and conflicting results. We introduce a novel experimental paradigm to examine cross-modal time perception by combining an auditory time perception task with a visually guided motor task, requiring participants to follow an elliptic movement on a screen with a robotic manipulandum. We find that subjective duration is distorted according to the speed of visually observed movement: the faster the visual motion, the longer the perceived duration. In contrast, the actual execution of the arm movement does not contribute to this effect, but impairs discrimination performance through dual-task interference. We also show that additional training of the motor task attenuates the interference, but does not affect the distortion of subjective duration. The study demonstrates a direct influence of visual motion on auditory temporal representations that is independent of attentional modulation. At the same time, it provides causal support for the notion that time perception and continuous motor timing rely on separate mechanisms, a proposal that was formerly supported by correlational evidence only. The results constitute a counterexample to the modality appropriateness hypothesis and are best explained by Bayesian integration of modality-specific temporal information into a centralized “temporal hub”.
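The Bayesian-integration account invoked in the conclusion corresponds to standard reliability-weighted cue combination: each modality's duration estimate is weighted by its inverse variance. A minimal sketch with illustrative numbers (not fitted to the study's data):

```python
def fuse_duration_estimates(d_aud, var_aud, d_vis, var_vis):
    """Reliability-weighted (maximum-likelihood) combination of two
    modality-specific duration estimates. The fused variance is smaller
    than either input variance, and the fused estimate is pulled toward
    the more reliable modality."""
    w_aud = (1 / var_aud) / (1 / var_aud + 1 / var_vis)
    d_fused = w_aud * d_aud + (1 - w_aud) * d_vis
    var_fused = 1 / (1 / var_aud + 1 / var_vis)
    return d_fused, var_fused

# Audition is typically more precise for timing, so its weight is larger:
# the fused duration sits near the auditory estimate but is still shifted
# by the conflicting visual cue.
d, v = fuse_duration_estimates(1.0, 0.01, 1.4, 0.04)
```

On this account, visual motion distorts perceived duration not because vision dominates, but because the visual cue retains a nonzero weight in the fused estimate.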

    Sound Frequency and Aural Selectivity in Sound-Contingent Visual Motion Aftereffect

    BACKGROUND: One possible strategy to evaluate whether signals in different modalities originate from a common external event or object is to form associations between inputs from different senses. This strategy would be quite effective because signals in different modalities from a common external event would then be aligned spatially and temporally. Indeed, it has been demonstrated that after adaptation to visual apparent motion paired with alternating auditory tones, the tones begin to trigger illusory motion perception of a static visual stimulus, where the perceived direction of visual lateral motion depends on the order in which the tones are replayed. The mechanisms underlying this phenomenon remain unclear. One important approach to understanding these mechanisms is to examine whether the effect shows selectivity in auditory processing. However, it has not yet been determined whether this aftereffect transfers across sound frequencies and between ears. METHODOLOGY/PRINCIPAL FINDINGS: Two circles placed side by side were presented in alternation, producing apparent motion perception, and each onset was accompanied by a tone burst of a specific and unique frequency. After exposure to this visual apparent motion with tones for a few minutes, the tones became drivers for illusory motion perception. However, the aftereffect was observed only when the adapter and test tones were presented at the same frequency and to the same ear. CONCLUSIONS/SIGNIFICANCE: These findings suggest that the auditory processing underlying the establishment of novel audiovisual associations is selective, potentially but not necessarily indicating that this processing occurs at an early stage.

    Distortions of Subjective Time Perception Within and Across Senses

    Background: The ability to estimate the passage of time is of fundamental importance for perceptual and cognitive processes. One experience of time is the perception of duration, which is not isomorphic to physical duration and can be distorted by a number of factors. Yet, the critical features generating these perceptual shifts in subjective duration are not understood. Methodology/Findings: We used prospective duration judgments within and across sensory modalities to examine the effect of stimulus predictability and feature change on the perception of duration. First, we found robust distortions of perceived duration in auditory, visual and auditory-visual presentations despite the predictability of the feature changes in the stimuli. For example, a looming disc embedded in a series of steady discs led to time dilation, whereas a steady disc embedded in a series of looming discs led to time compression. Second, we addressed whether visual (auditory) inputs could alter the perceived duration of auditory (visual) inputs. When participants were presented with incongruent audio-visual stimuli, the perceived duration of auditory events could be shortened or lengthened by the presence of conflicting visual information; however, the perceived duration of visual events was seldom distorted by the presence of auditory information, and visual events were never perceived as shorter than their actual durations. Conclusions/Significance: These results support the existence of multisensory interactions in the perception of duration and, importantly, suggest that vision can modify auditory temporal perception in a pure timing task. Insofar as distortions in subjective duration cannot be accounted for by the unpredictability of an auditory, visual or auditory-visual event, we propose that it is the intrinsic features of the stimulus that critically affect subjective time distortions.

    Benefits of Stimulus Congruency for Multisensory Facilitation of Visual Learning

    Background. Studies of perceptual learning have largely focused on unisensory stimuli. However, multisensory interactions are ubiquitous in perception, even at early processing stages, and thus can potentially play a role in learning. Here, we examine the effect of auditory-visual congruency on visual learning. Methodology/Principal Findings. Subjects were trained over five days on a visual motion coherence detection task with either congruent or incongruent audiovisual stimuli. Comparing performance on visual-only trials, we find that training with congruent audiovisual stimuli produces significantly better learning than training with incongruent audiovisual stimuli or with only visual stimuli. Conclusions/Significance. This advantage from stimulus congruency during training suggests that the benefits of multisensory training may result from audiovisual interactions at a perceptual rather than a cognitive level.

    Symmetric Sensorimotor Somatotopy

    BACKGROUND: Functional imaging has recently been used to investigate detailed somatosensory organization in human cortex. Such studies frequently assume that human cortical areas are only identifiable insofar as they resemble those measured invasively in monkeys. This is true despite the electrophysiological basis of the latter recordings, which are typically extracellular recordings of action potentials from a restricted sample of cells. METHODOLOGY/PRINCIPAL FINDINGS: Using high-resolution functional magnetic resonance imaging in human subjects, we found a widely distributed cortical response in both primary somatosensory and motor cortex upon pneumatic stimulation of the hairless surface of the thumb, index and ring fingers. Though not organized in a discrete somatotopic fashion, the population activity in response to thumb and index finger stimulation indicated a disproportionate response to fingertip stimulation, one that was modulated by stimulation direction. Furthermore, the activation was structured with a line of symmetry through the central sulcus, reflecting inputs both to primary somatosensory cortex and, precentrally, to primary motor cortex. CONCLUSIONS/SIGNIFICANCE: By considering functional activation that is not somatotopically or anatomically restricted as in monkey electrophysiology studies, our methodology reveals finger-related activation that is not organized in a simple somatotopic manner but is nevertheless as structured as it is widespread. Our findings suggest a striking functional mirroring in cortical areas conventionally ascribed either an input or an output somatotopic function.